4 research outputs found

    Classification of Minimal Separating Sets in Low Genus Surfaces

    Consider a surface S and let M ⊂ S. If S ∖ M is not connected, then we say M separates S, and we refer to M as a separating set of S. If M separates S, and no proper subset of M separates S, then we say M is a minimal separating set of S. In this paper we use methods of computational combinatorial topology to classify the minimal separating sets of the orientable surfaces of genus g = 2 and g = 3. The classification for genus 0 and 1 was done in earlier work, using methods of algebraic topology.
    Comment: 24 pages, 5 figures, 2 tables (11 pages
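    The definitions in the abstract can be mirrored on a finite graph. The following is a toy sketch only: the paper classifies separating sets of surfaces using computational combinatorial topology, whereas this snippet just encodes "separates" and "minimal separating set" for a graph, where removing a vertex set plays the role of deleting M from S. All function names here are illustrative, not from the paper.

    ```python
    from itertools import combinations

    def is_connected(vertices, edges):
        """Depth-first connectivity check on an undirected graph."""
        verts = set(vertices)
        if not verts:
            return True
        adj = {v: set() for v in verts}
        for u, w in edges:
            if u in verts and w in verts:
                adj[u].add(w)
                adj[w].add(u)
        seen = set()
        stack = [next(iter(verts))]
        while stack:
            v = stack.pop()
            if v in seen:
                continue
            seen.add(v)
            stack.extend(adj[v] - seen)
        return seen == verts

    def separates(M, vertices, edges):
        """M separates the graph if deleting M leaves it disconnected."""
        return not is_connected([v for v in vertices if v not in M], edges)

    def is_minimal_separating(M, vertices, edges):
        """M separates, but no proper subset of M does (checked exhaustively)."""
        M = set(M)
        if not separates(M, vertices, edges):
            return False
        return all(
            not separates(set(sub), vertices, edges)
            for r in range(len(M))
            for sub in combinations(M, r)
        )
    ```

    For example, on a 6-cycle the pair {0, 3} is a minimal separating set: deleting both vertices leaves two disjoint arcs, while deleting either one alone leaves a connected path.
    
    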

    Learning High-Dimensional Nonparametric Differential Equations via Multivariate Occupation Kernel Functions

    Learning a nonparametric system of ordinary differential equations (ODEs) from nn trajectory snapshots in a dd-dimensional state space requires learning dd functions of dd variables. Explicit formulations scale quadratically in dd unless additional knowledge about system properties, such as sparsity and symmetries, is available. In this work, we propose a linear approach to learning using the implicit formulation provided by vector-valued Reproducing Kernel Hilbert Spaces. By rewriting the ODEs in a weaker integral form, which we subsequently minimize, we derive our learning algorithm. The minimization problem's solution for the vector field relies on multivariate occupation kernel functions associated with the solution trajectories. We validate our approach through experiments on highly nonlinear simulated and real data, where dd may exceed 100. We further demonstrate the versatility of the proposed method by learning a nonparametric first order quasilinear partial differential equation.Comment: 22 pages, 3 figures, submitted to Neurips 202
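    The integral reformulation described above can be sketched in a few lines. This is a schematic illustration under stated assumptions, not the paper's implementation: a scalar Gaussian kernel stands in for the vector-valued RKHS, integrals along trajectories are approximated by Riemann sums, and the weak form x(T) − x(0) = ∫ f(x(t)) dt is imposed per trajectory. The names `occupation_features` and `fit_occupation_kernel`, the kernel width, and the regularization are all illustrative choices.

    ```python
    import numpy as np

    def occupation_features(trajs, dt, y, sigma=1.0):
        """L_i(y) = ∫ k(x_i(t), y) dt along trajectory i, via a Riemann sum."""
        feats = []
        for X in trajs:                       # X: (T, d) snapshots of one trajectory
            d2 = np.sum((X - y) ** 2, axis=1)
            feats.append(np.sum(np.exp(-d2 / (2 * sigma**2))) * dt)
        return np.array(feats)                # shape (n,)

    def fit_occupation_kernel(trajs, dt, sigma=1.0, lam=1e-6):
        """Solve the regularized linear system arising from the weak form."""
        n = len(trajs)
        # Gram matrix of occupation kernels: G[i, j] = ∫∫ k(x_i(s), x_j(t)) ds dt
        G = np.zeros((n, n))
        for i, Xi in enumerate(trajs):
            for j, Xj in enumerate(trajs):
                d2 = np.sum((Xi[:, None, :] - Xj[None, :, :]) ** 2, axis=2)
                G[i, j] = np.sum(np.exp(-d2 / (2 * sigma**2))) * dt * dt
        # Right-hand side from the integral form: x_i(T) - x_i(0) = ∫ f(x_i(t)) dt
        B = np.stack([X[-1] - X[0] for X in trajs])       # (n, d)
        A = np.linalg.solve(G + lam * np.eye(n), B)        # coefficients (n, d)
        return lambda y: occupation_features(trajs, dt, y, sigma) @ A

    # Toy data: trajectories of the 1-d linear ODE xdot = -x
    dt = 0.02
    t = np.arange(0.0, 1.0, dt)
    trajs = [x0 * np.exp(-t)[:, None] for x0 in (1.0, 2.0, -1.0, -2.0)]
    fhat = fit_occupation_kernel(trajs, dt)
    ```

    By construction the fitted field satisfies each integral constraint up to the regularization, so integrating `fhat` along a training trajectory approximately recovers that trajectory's net displacement.
    
    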

    Learning nonparametric ordinary differential equations from noisy data

    Learning nonparametric systems of Ordinary Differential Equations (ODEs) ẋ = f(t, x) from noisy data is an emerging machine learning topic. We use the well-developed theory of Reproducing Kernel Hilbert Spaces (RKHS) to define candidates for f for which the solution of the ODE exists and is unique. Learning f consists of solving a constrained optimization problem in an RKHS. We propose a penalty method that iteratively uses the Representer theorem and Euler approximations to provide a numerical solution. We prove a generalization bound for the L2 distance between x and its estimator and provide experimental comparisons with the state-of-the-art.
    Comment: 25 pages, 6 figures
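    The two ingredients named above, the Representer theorem and Euler approximations, can be sketched for a single iteration of a penalty scheme. This is a minimal illustration under stated assumptions, not the authors' algorithm: a scalar Gaussian kernel, a forward-Euler residual, one kernel-ridge fit of f with the state estimate fixed at the noisy data, and illustrative values for the kernel width, regularization, and penalty weight.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Noisy snapshots of the scalar ODE xdot = -x on [0, 2]
    h = 0.05
    t = np.arange(0.0, 2.0, h)
    y = np.exp(-t) + 0.01 * rng.standard_normal(t.size)

    def gram(a, b, sigma=0.3):
        """Gaussian kernel matrix k(a_i, b_j)."""
        return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma**2))

    def fit_f(x, h, lam=0.1):
        """Representer-theorem step: f = sum_j c_j k(x_j, .), fitted by kernel
        ridge regression to the forward-Euler slopes (x_{k+1} - x_k) / h."""
        xs = x[:-1]
        slopes = (x[1:] - x[:-1]) / h
        K = gram(xs, xs)
        c = np.linalg.solve(K + lam * np.eye(xs.size), slopes)
        return lambda z: gram(np.atleast_1d(np.asarray(z, float)), xs) @ c

    def penalty_objective(x, f, y, h, mu):
        """Data fidelity plus a penalty on the Euler-discretized ODE residual."""
        data = np.sum((x - y) ** 2)
        euler = np.sum((x[1:] - x[:-1] - h * f(x[:-1])) ** 2)
        return data + mu * euler

    f = fit_f(y, h)                    # one fit, state estimate fixed at the data
    obj = penalty_objective(y, f, y, h, mu=10.0)
    ```

    A full penalty method would alternate between refitting f and updating the state estimate x to lower this objective; the single step above only shows how the Euler residual and the RKHS fit interact.
    
    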

    Learning Nonparametric Ordinary Differential Equations: Application to Sparse and Noisy Data

    Learning nonparametric systems of Ordinary Differential Equations (ODEs) ẋ = f(t, x) from noisy and sparse data is an emerging machine learning topic. We use the well-developed theory of Reproducing Kernel Hilbert Spaces (RKHS) to define candidates for f for which the solution of the ODE exists and is unique. Learning f consists of solving a constrained optimization problem in an RKHS. We propose a penalty method that iteratively uses the Representer theorem and Euler approximations to provide a numerical solution. We prove a generalization bound for the L2 distance between x and its estimator. Experiments are provided for the FitzHugh-Nagumo oscillator and for the prediction of the Amyloid level in the cortex of aging subjects. In both cases, we show competitive results when compared with the state of the art.